Boosting with the L_2-Loss: Regression and Classification

Author

  • Bin Yu
Abstract

Related articles

High-Dimensional $L_2$Boosting: Rate of Convergence

Boosting is one of the most significant developments in machine learning. This paper studies the rate of convergence of L2Boosting, which is tailored for regression, in a high-dimensional setting. Moreover, we introduce so-called “post-Boosting”. This is a post-selection estimator which applies ordinary least squares to the variables selected in the first stage by L2Boosting. Another variant is...
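
As a rough illustration of the two stages described above, a minimal sketch of componentwise linear L2Boosting followed by a post-selection OLS refit ("post-Boosting") might look like the following; the function names, number of steps, and shrinkage factor nu are assumptions for illustration, not the paper's reference implementation.

```python
# Sketch: componentwise linear L2Boosting plus a "post-Boosting" OLS refit.
# Assumes the columns of X are roughly centered; all parameters are illustrative.
import numpy as np

def l2_boost(X, y, n_steps=100, nu=0.1):
    """Repeatedly fit the current residuals with the single best-fitting
    column of X, shrinking each update by nu."""
    n, p = X.shape
    beta = np.zeros(p)
    intercept = y.mean()
    resid = y - intercept
    col_ss = (X ** 2).sum(axis=0)            # column sums of squares
    for _ in range(n_steps):
        coefs = X.T @ resid / col_ss          # univariate LS coefficient per column
        sse = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
        j = np.argmin(sse)                    # best single predictor this step
        beta[j] += nu * coefs[j]
        resid -= nu * coefs[j] * X[:, j]
    return intercept, beta

def post_boost(X, y, beta):
    """Post-Boosting: ordinary least squares restricted to the variables
    that L2Boosting selected (nonzero coefficients)."""
    selected = np.flatnonzero(beta)
    Z = np.column_stack([np.ones(len(y)), X[:, selected]])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return selected, coef
```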

Functional Frank-Wolfe Boosting for General Loss Functions

Boosting is a generic learning method for classification and regression. Yet, as the number of base hypotheses becomes larger, boosting can lead to a deterioration of test performance. Overfitting is an important and ubiquitous phenomenon, especially in regression settings. To avoid overfitting, we consider using l1 regularization. We propose a novel Frank-Wolfe type boosting algorithm (FWBoost...
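
A hedged sketch of a Frank-Wolfe-type boosting update under an l1 constraint, in the spirit of the abstract above: the hypothesis matrix H (each column holding one base hypothesis's predictions on the sample), the radius tau, and the squared loss are illustrative assumptions and may differ from the FWBoost algorithm in the paper.

```python
# Sketch: Frank-Wolfe over the l1-ball {w : ||w||_1 <= tau}, squared loss,
# with the ensemble prediction given by H @ w.  Purely illustrative.
import numpy as np

def fw_boost(H, y, tau=10.0, n_iters=200):
    n, m = H.shape
    w = np.zeros(m)
    for t in range(n_iters):
        grad = H.T @ (H @ w - y) / n      # gradient of 0.5 * mean squared error
        j = np.argmax(np.abs(grad))       # base hypothesis with steepest descent
        s = np.zeros(m)
        s[j] = -tau * np.sign(grad[j])    # vertex of the l1-ball
        gamma = 2.0 / (t + 2.0)           # standard Frank-Wolfe step size
        w = (1 - gamma) * w + gamma * s   # convex combination keeps ||w||_1 <= tau
    return w
```

The l1 constraint caps the total weight placed on the base hypotheses, which is one way to control the overfitting the abstract warns about as the ensemble grows.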

Machine Learning Models for Housing Prices Forecasting using Registration Data

This article aims to identify the housing-price forecasting model with the highest accuracy and lowest error among machine learning methods. Five machine learning algorithms are used to predict housing prices: K-Nearest Neighbor Regression (KNNR), Support Vector Regression (SVR), Random Forest Regression (RFR), Extreme Gradient B...
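
For orientation only, a small sketch of how such a model comparison could be run with scikit-learn and cross-validated RMSE; the model settings are assumptions, the article's registration data are not reproduced here, and XGBoost would additionally require the xgboost package.

```python
# Sketch: compare several regressors named in the abstract by cross-validated RMSE.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

models = {
    "KNNR": KNeighborsRegressor(n_neighbors=5),
    "SVR": SVR(kernel="rbf", C=1.0),
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
}

def compare(X, y):
    """Report 5-fold cross-validated RMSE for each candidate model."""
    for name, model in models.items():
        mse = -cross_val_score(model, X, y, cv=5,
                               scoring="neg_mean_squared_error")
        print(f"{name}: RMSE = {np.sqrt(mse.mean()):.3f}")
```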

Smooth ε-Insensitive Regression by Loss Symmetrization

We describe a framework for solving regression problems by reduction to classification. Our reduction is based on symmetrization of margin-based loss functions commonly used in boosting algorithms, namely, the logistic-loss and the exponential-loss. Our construction yields a smooth version of the ε-insensitive hinge loss that is used in support vector regression. Furthermore, this construction ...
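
Going by the abstract's description, the symmetrized loss can be sketched as two mirrored logistic losses on the residual, each shifted by ε; the exact form and normalization used in the paper may differ.

```python
# Sketch of a smooth epsilon-insensitive loss built by symmetrizing the logistic loss.
import numpy as np

def smooth_eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """log(1 + exp(d - eps)) + log(1 + exp(-d - eps)), with d = y_pred - y_true.
    Near d = 0 the loss is nearly flat (the insensitive zone); for |d| >> eps
    it grows roughly linearly, like the hinge-style epsilon-insensitive loss."""
    d = y_pred - y_true
    return np.logaddexp(0.0, d - eps) + np.logaddexp(0.0, -d - eps)
```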

Multi-class Boosting

This paper briefly surveys existing methods for boosting multi-class classification algorithms and compares the performance of one such implementation, Stagewise Additive Modeling using a Multi-class Exponential loss function (SAMME), against Softmax Regression, Classification and Regression Trees, and Neural Networks.
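
As an informal illustration, SAMME is available through scikit-learn's AdaBoostClassifier and can be compared against a softmax (multinomial logistic regression) baseline; the dataset, hyper-parameters, and the availability of the algorithm="SAMME" flag in a given scikit-learn version are assumptions here, and this is not the comparison run in the paper.

```python
# Sketch: SAMME boosting vs. a softmax-regression baseline on a toy multi-class dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# SAMME boosting over depth-1 decision trees (scikit-learn's default base learner)
samme = AdaBoostClassifier(n_estimators=200, algorithm="SAMME", random_state=0)
# Softmax (multinomial logistic) regression as a simple multi-class baseline
softmax = LogisticRegression(max_iter=1000)

for name, clf in [("SAMME", samme), ("Softmax regression", softmax)]:
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```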

Journal:

Volume:  Issue:

Pages:  -

Publication year: 2001